A modified nonmonotone BFGS algorithm for unconstrained optimization

Authors

  • Xiangrong Li
  • Bopeng Wang
  • Wujie Hu
Abstract

In this paper, a modified BFGS algorithm is proposed for unconstrained optimization. The proposed algorithm has the following properties: (i) a nonmonotone line search technique is used to obtain the step size $\alpha_k$, which improves the effectiveness of the algorithm; (ii) the algorithm possesses not only global convergence but also superlinear convergence for general convex functions; (iii) the algorithm produces better numerical results than the standard BFGS method.
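As a rough illustration of the kind of method described in the abstract, the sketch below combines the standard BFGS update of the inverse Hessian approximation with a Grippo-type max-based nonmonotone Armijo line search. It is not the authors' exact algorithm; the memory length M, the Armijo constant delta, and the backtracking factor rho are illustrative choices.

    # Sketch: BFGS with a max-based nonmonotone Armijo line search
    # (illustrative only; parameter values and names are assumptions).
    import numpy as np

    def nonmonotone_bfgs(f, grad, x0, M=10, delta=1e-4, rho=0.5,
                         tol=1e-6, max_iter=500):
        n = len(x0)
        x, H = np.asarray(x0, float), np.eye(n)   # H approximates the inverse Hessian
        f_hist = [f(x)]                           # recent values for the nonmonotone test
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) <= tol:
                break
            d = -H @ g                            # quasi-Newton search direction
            # Nonmonotone Armijo: compare with the max of the last M function values.
            f_ref, alpha = max(f_hist[-M:]), 1.0
            while f(x + alpha * d) > f_ref + delta * alpha * (g @ d):
                alpha *= rho                      # backtracking
            x_new = x + alpha * d
            s, y = x_new - x, grad(x_new) - g
            sy = s @ y
            if sy > 1e-10:                        # skip update if curvature condition fails
                rho_k = 1.0 / sy
                V = np.eye(n) - rho_k * np.outer(s, y)
                H = V @ H @ V.T + rho_k * np.outer(s, s)   # inverse BFGS update
            x = x_new
            f_hist.append(f(x))
        return x

With M = 1 the acceptance test reduces to the usual monotone Armijo rule, so the nonmonotone behaviour comes entirely from comparing against the maximum of the last M function values.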

Similar articles

Modified Limited Memory BFGS Method with Nonmonotone Line Search for Unconstrained Optimization

In this paper, we propose two limited memory BFGS algorithms with a nonmonotone line search technique for unconstrained optimization problems. The global convergence of the given methods will be established under suitable conditions. Numerical results show that the presented algorithms are more competitive than the normal BFGS method.
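For the limited-memory variants mentioned here, the dense inverse-Hessian approximation is replaced by the standard two-loop recursion over the last few curvature pairs (s_i, y_i). A minimal sketch of that recursion follows; the function name and the initial scaling are common textbook choices, not taken from the paper.

    # Two-loop recursion: computes the L-BFGS direction -H_k g implicitly
    # from the stored (s, y) pairs (illustrative sketch).
    import numpy as np

    def lbfgs_direction(g, s_list, y_list):
        q, alphas = g.copy(), []
        for s, y in zip(reversed(s_list), reversed(y_list)):   # first loop (newest first)
            rho = 1.0 / (s @ y)
            a = rho * (s @ q)
            alphas.append(a)
            q -= a * y
        if s_list:                                             # initial scaling H0 = gamma * I
            s, y = s_list[-1], y_list[-1]
            q *= (s @ y) / (y @ y)
        for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # second loop (oldest first)
            rho = 1.0 / (s @ y)
            b = rho * (y @ q)
            q += (a - b) * s
        return -q                                              # search direction d = -H_k g

The nonmonotone line search from the previous sketch can then be applied along the returned direction.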


The Global Convergence of Self-Scaling BFGS Algorithm with Nonmonotone Line Search for Unconstrained Nonconvex Optimization Problems

The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration, in order to avoid excessively large eigenvalues in the Hessian approximations of the objective function. It has been proved in the literature that this method has global and superlinear convergence when the objective function is...
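A common textbook form of the self-scaling device described here, shown only to fix ideas (the exact scaling factor used in that paper may differ), multiplies the current approximation B_k by a factor tau_k before the BFGS update:

    \tau_k = \frac{s_k^{\top} y_k}{s_k^{\top} B_k s_k}, \qquad
    B_{k+1} = \tau_k \Bigl( B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k} \Bigr)
              + \frac{y_k y_k^{\top}}{s_k^{\top} y_k},

where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. When B_k overestimates the curvature along s_k, we get tau_k < 1, so the scaling shrinks the retained part of B_k and damps large eigenvalues.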



The modified BFGS method with new secant relation for unconstrained optimization problems

Using Taylor's series, we propose a modified secant relation to obtain a more accurate approximation of the curvature of the objective function. Then, based on this modified secant relation, we present a new BFGS method for solving unconstrained optimization problems. The proposed method makes use of both gradient and function values, while the usual secant relation uses only gradient values. U...
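One well-known modified secant relation of this type, shown here only to illustrate how function values can enter the relation (the relation used in that paper may differ), replaces the classical condition B_{k+1} s_k = y_k by

    B_{k+1} s_k = \tilde{y}_k, \qquad
    \tilde{y}_k = y_k + \frac{\theta_k}{\|s_k\|^2}\, s_k, \qquad
    \theta_k = 2\bigl(f(x_k) - f(x_{k+1})\bigr) + \bigl(g_k + g_{k+1}\bigr)^{\top} s_k,

which uses the function values f(x_k) and f(x_{k+1}) in addition to the gradients. A short Taylor expansion shows that theta_k vanishes when f is quadratic, so the relation reduces to the classical secant equation in that case.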


A Trust-region Method using Extended Nonmonotone Technique for Unconstrained Optimization

In this paper, we present a nonmonotone trust-region algorithm for unconstrained optimization. We first introduce a variant of the nonmonotone strategy proposed by Ahookhosh and Amini [AhA 01] and incorporate it into the trust-region framework to construct a more efficient approach. Our new nonmonotone strategy combines the current function value with the maximum function values in some pri...
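The combination described in this abstract is typically written as a convex combination of the current value and a running maximum. A sketch of an Ahookhosh–Amini-type nonmonotone term is (the exact weights and acceptance test in that paper may differ):

    R_k = \eta_k f_{l(k)} + (1 - \eta_k) f_k, \qquad
    f_{l(k)} = \max_{0 \le j \le m(k)} f_{k-j}, \qquad
    \eta_k \in [\eta_{\min}, \eta_{\max}] \subset [0, 1),

and a trial step d_k is accepted when the ratio of the actual reduction R_k - f(x_k + d_k) to the reduction predicted by the quadratic model is sufficiently large.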



Journal:

Volume: 2017   Issue:

Pages: -

Publication date: 2017